# RoBERTa architecture

| Model | Author | License | Tags | Downloads | Likes | Description |
| --- | --- | --- | --- | --- | --- | --- |
| Ambert | surafelkindu | MIT | Large Language Model, Transformers | 57 | 1 | An Amharic language model based on the RoBERTa architecture, suitable for a variety of natural language processing tasks. |
| Sinhalaberto | keshan | | Large Language Model, Other | 34 | 1 | A relatively small model trained on the deduplicated OSCAR Sinhala dataset, providing foundational support for the low-resource Sinhala language. |
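
Both entries are RoBERTa-style masked-language models, so they can in principle be loaded through the Transformers auto classes. The sketch below is a minimal illustration, assuming the checkpoints are published on the Hugging Face Hub; the repository id and the example sentence are placeholders rather than details confirmed by this listing.

```python
# Minimal sketch: loading a RoBERTa-based checkpoint with Hugging Face Transformers.
# The repo id below is an assumed placeholder; substitute the id from the model's page.
from transformers import AutoModelForMaskedLM, AutoTokenizer, pipeline

model_id = "keshan/SinhalaBERTo"  # assumption: actual hub id may differ

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# RoBERTa models are pretrained with masked-language modelling, so a fill-mask
# pipeline is the most direct way to probe them. Use a sentence in the model's
# language (Sinhala or Amharic here) for meaningful predictions.
fill_mask = pipeline("fill-mask", model=model, tokenizer=tokenizer)
print(fill_mask(f"Example sentence with a {tokenizer.mask_token} token."))
```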